MSC4276: Soft unfailure for self redactions (#4276)
Conversation
Implementation requirements:
- Server
> When a user is removed from a room, the server may issue several redactions to their messages to clean
> up rooms. Users may also use similar functionality, if supported by their server, to remove their own
> messages after being removed.
What is the context for a remote server or remote user redacting their own messages?
Some servers apply redactions as an erasure technique.
But why are the servers erasing messages? Can this please be added to the context in the MSC?
> When a user is removed from a room, the server may issue several redactions to their messages to clean
> up rooms. Users may also use similar functionality, if supported by their server, to remove their own
> messages after being removed.
What does "removed" mean explicitly? Does it mean kicked, banned, or both?
Can this please be made explicit in the text?
> When a user is removed from a room, the server may issue several redactions to their messages to clean
> up rooms. Users may also use similar functionality, if supported by their server, to remove their own
> messages after being removed.
If it means kicked or banned, then is good faith being assumed in either remote users or servers?
Ok, I guess the context is something like this: a user registers on a server for use as a spam/brigade throwaway, and is banned by a number of remote servers. The server admins then discover the throwaway and send redactions from that same user (hijacking the account) to clean up and try to undo some of the damage. In the rooms where the user has been banned, the redactions will have to be authorized against prior state, and will soft fail.
So I guess "good faith" here means both the discovery of the throwaway by the server admins, and then also their cooperation. But that's not the right way to frame the situation: this is really a proposal to assist server admins in cleaning up accounts that have violated their ToS.
I'm not following the concern here, sorry.
> * Only the redactions received within 1 hour of the most recent membership event change can bypass
>   soft failure.
What tooling is expected to be deployed by remote users or servers such that they send redactions within 1 hour?
This is to deal with possible federation delays, not tooling delays. The redactions may take a little while to send.
I guess the problem I'm trying to communicate here is that if the user parts from a room an hour before the server admin responsible for them discovers that the account was used for abuse, then the redactions will be soft failed.
Additionally, not all server admins respond within 1 hour; the user may already have been banned hours and hours before a server admin addresses the issue, potentially leaving soft-failed media (hi, race conditions) in place for other servers in the room.
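The 1-hour window being debated could be sketched as follows. This is a hypothetical illustration, not code from any homeserver: `may_bypass_soft_failure` and the minimal event dicts are invented here, though the `sender`, `state_key`, and `origin_server_ts` field names follow the Matrix event format.

```python
# Hypothetical sketch of the proposed 1-hour bypass window.
WINDOW_MS = 60 * 60 * 1000  # 1 hour, in milliseconds

def may_bypass_soft_failure(redaction: dict, membership_change: dict) -> bool:
    """Return True if a self-redaction falls inside the proposed window.

    `redaction` is an m.room.redaction event; `membership_change` is the
    most recent m.room.member event for the affected user. Both carry
    `origin_server_ts` in milliseconds since the Unix epoch.
    """
    # Only the affected user's own redactions are considered.
    if redaction["sender"] != membership_change["state_key"]:
        return False
    elapsed = redaction["origin_server_ts"] - membership_change["origin_server_ts"]
    return 0 <= elapsed <= WINDOW_MS
```

The concern above is visible in this sketch: a redaction whose `origin_server_ts` lands more than an hour after the membership change falls outside the window and soft fails, however legitimate the cleanup is.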
> ## Alternatives
>
> Another approach could be to modify auth rules to exempt same-sender `m.room.redaction` events from the requirement
> to pass authorization at the current resolved state. This approach may not work well with [how redactions work](https://spec.matrix.org/v1.13/rooms/v11/#handling-redactions).
MSC4194 is an alternative that does not require good faith in remote users and servers. For use by room moderators.
MSC4194 requires this MSC in order to work - it's not an alternative.
MSC4194 is for use by room moderators, using the room moderator's account to send redactions rather than the target user's, so it does not require this MSC: the redaction events all use valid auth state. This is true even when redactions are sent for events that have been soft failed locally.
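The distinction being drawn here can be sketched with a deliberately simplified reading of the redaction auth check (real servers implement considerably more rules; `can_redact` and its parameters are illustrative, not any homeserver's API): roughly, the sender must currently be joined, and must either be redacting their own event or meet the room's redact power level.

```python
# Simplified, illustrative sketch of why a moderator's redaction
# authorizes against current state while a banned user's does not.
def can_redact(sender_membership: str, sender_power: int,
               redact_level: int, redacts_own_event: bool) -> bool:
    # The sender must currently be joined to the room...
    if sender_membership != "join":
        return False
    # ...and must either redact their own event or have sufficient power.
    return redacts_own_event or sender_power >= redact_level
```

Under this sketch, a banned user fails the membership check even for their own events (`can_redact("ban", 0, 50, True)` is `False`), while a joined moderator with sufficient power passes (`can_redact("join", 50, 50, False)` is `True`), which is the sense in which MSC4194-style moderator redactions do not need this MSC.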
> due to the authorization rules preventing the redaction event from being validated, despite being part of the
> DAG at a legal point.
>
> This proposal suggests that servers be less strict about soft failing self-redactions in particular.
I think servers should be more strict about self-redactions due to legal requirements, room policy (it would suck if someone removed useful information or framed someone by removing context), or moderation purposes (reporting messages after a ban, machine learning, etc).
> While this solution works, it opens up doors to undesirable effects:

For the specific use case of cleaning up spam in a room, especially after a ban, MSC4194-style endpoints may wish to support using a room moderator's account instead of the subject user to send redactions. MSC4293 may also be of interest to some communities.

For the cases where the server is trying to be a good citizen and is cleaning up spam in all rooms the user posted to, the server may wish to withhold the leave event (from deactivation or similar) until the user's server-sent self redactions are fully processed.

I'm closing this MSC in favour of those alternative approaches.
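The "withhold the leave event" ordering suggested in the closing comment could be sketched as below. The helper and the `send_event` callback are hypothetical, named here only to illustrate the ordering; the point is simply that the redactions go out while the user's membership is still `join`, so they authorize against joined state.

```python
# Hedged sketch of the ordering idea: emit server-generated
# self-redactions first, then the leave event.
def erase_then_leave(send_event, redactions: list, leave_event: dict) -> None:
    # While the user is still joined, each redaction authorizes
    # against joined room state...
    for redaction in redactions:
        send_event(redaction)
    # ...and only once they are all sent does the membership change.
    send_event(leave_event)
```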
Author: @matrix-org/trust-safety
Shepherd: @matrix-org/trust-safety
Conflict of interest disclosure: Many members of the T&S team are Element employees and may serve additional roles outside their primary responsibility. Not all members of the T&S team are publicly known and disclosing all conflicts may reveal their identities - they have been excluded for this reason.